Alleged copyright infringements in the training of AI models, compulsory mediation and the boundaries of Brexit are among the recent landmark developments exercising IP lawyers. Joanna Goodman reports
The low down
In the early weeks of 2025, IP lawyers have seen the first ruling in the UK to require compulsory mediation, and an important and high-profile decision in a lookalike products dispute. Landmark cases are also proliferating that will partially determine the scope and scale of the rollout of artificial intelligence, as copyright owners battle the allegedly illegal use of their content to train AI models. Thomson Reuters scored a ‘stunning’ win just last week on fair use of the legal information platform Westlaw, but is the judgment really as significant as it appears? Across the Atlantic, too, president Trump’s aversion to multilateralism has raised fears about the future of cross-border enforcement.
2025 is already a significant year for intellectual property law, due to the intensifying debate over the use of copyright material to train generative AI models and tensions between different IP regimes. Major cases are bringing clarity to a confusing landscape. In the US, however, president Trump is less than keen on international organisations, including IP courts and institutions, whose rulings have extra-territorial effect.
A major concern for the creative industries is the extent to which copyright works can be used to train AI models. Governance and regulation mean balancing the interests of AI developers and copyright owners. While prime minister Sir Keir Starmer plans to turbocharge the UK’s AI sector, the creative industries contribute £126bn to the UK economy and employ 2.4 million people.
The government’s consultation on text and data mining (TDM), open until 25 February, asks for feedback on potential options. These include: leaving copyright laws unchanged; an opt-in model requiring licensing, which would strengthen copyright protection but conflict with the government’s AI plans; a broad TDM exception, which would provide no protection for the creative industries; and the government’s preferred option, an opt-out model. This would allow AI developers to train their models on copyrighted content, introduce transparency measures requiring developers to reveal what material their AI models have been trained on, and give copyright owners the opportunity to opt out. This last option would reflect the Digital Copyright Directive ((EU) 2019/790).
Pushback from the creative industries was echoed during the report stage of the Data (Use and Access) Bill. This received its second reading in the Commons last week, having been significantly amended in the Lords, where it originated. The amendments, introduced by film-maker Baroness Kidron, propose an alternative, enforceable copyright regime.
She explained: ‘Amendment 61 would ensure that all operators of web crawlers must comply with UK law if they are marketed in the UK. Amendments 62 and 63 would require operators to be transparent about their identity and purpose, and allow creatives to understand if their content had been stolen. Amendment 64 would give enforcement powers to the ICO (Information Commissioner’s Office) and allow for a private right of action by copyright holders. Amendment 44A would require the ICO to report on its enforcement record. Finally, Amendment 65 would require the secretary of state to review technical solutions that might support a strong copyright regime.’
She added: ‘My amendments mandate that companies have to account for where and when they take the material and make it transparent. It makes copyright law fit for the age of AI. It makes tech accountable.’
As Nina O’Sullivan, partner and head knowledge lawyer (contentious) at Mishcon de Reya, observes, this raises challenges for the government beyond territorial boundaries. ‘While the government has said that its preferred approach of an exception for text and data mining with rights-holder opt-out aligns with the approach already adopted by the EU, there is one crucial difference. The EU regime purports to implement extra-territoriality – requiring those putting general purpose AI models on the market in the EU to comply with EU copyright law wherever the models were trained. The UK consultation merely says that the government wants to “encourage” AI developers operating in the UK to comply with UK law on AI model training, even where their models are trained in other countries, but with little information as to how this encouragement may bear fruit.
‘Interestingly, one of the proposed amendments to the Data (Use and Access) Bill requires model developers to comply with UK copyright law regardless of where the … pre-training, development and operation take place. Of course, it is likely that these amendments will not survive, but the government will need to face this issue head on when it responds to the consultation.’